Distributed Optimization for Non-Strongly Convex Regularizers

Authors

  • Simone Forte
  • Thomas Hofmann
  • Martin Jaggi
  • Matthias Seeger
  • Virginia Smith
Abstract

We develop primal-dual algorithms for distributed training of linear models in the Spark framework. We present the ProxCoCoA+ method, a generalization of the CoCoA+ algorithm that extends it to the case of general strongly convex regularizers. A primal-dual convergence rate analysis is provided, along with an experimental evaluation of the algorithm on elastic net regularized logistic regression. We also develop the PrimalCoCoA+ method, which allows certain non-strongly convex regularizers to be handled within the ProxCoCoA+ theoretical framework; the algorithm works under the assumption that these regularizers are linearly separable and box constrained. This yields primal-dual convergence rates for L1 regularized models, which are, to the best of our knowledge, the first of their kind; we also evaluate the practical efficiency of this method for L1 regularized logistic regression on two real-world datasets. Finally, we experimentally explore and demonstrate the validity of ProxCoCoA+ Wild and PrimalCoCoA+ Wild, two new optimization methods that combine distributed and parallel optimization techniques and achieve significant speed-ups over their non-wild variants.
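To make the problem class concrete, a standard formulation of elastic net regularized logistic regression (generic notation, assuming labels y_i ∈ {−1, +1}; the exact parameterization used in the paper may differ) is

\min_{w \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} \log\left(1 + \exp(-y_i x_i^\top w)\right) + \lambda \left( \alpha \|w\|_1 + \frac{1 - \alpha}{2} \|w\|_2^2 \right),

where λ > 0 sets the overall regularization strength and α ∈ [0, 1] trades off the L1 and L2 terms; α = 1 recovers pure L1 regularization, the non-strongly convex case addressed by PrimalCoCoA+.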


Similar resources

CoCoA: A General Framework for Communication-Efficient Distributed Optimization

The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present a general-purpose framework for the distributed environment, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly conv...
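To illustrate what a communication-efficient round looks like in this setting, the following is a minimal sketch, not the actual CoCoA subproblem or its Spark implementation: each worker holds one data partition, improves the shared model locally (here with a few gradient steps on an illustrative ridge-regression objective), and only one aggregated vector is exchanged per round. All function names and parameters below are assumptions made for the example.

import numpy as np

def local_update(X, y, w, steps=10, lr=0.1, lam=0.1):
    # Approximate local solver: a few gradient steps on this partition's
    # ridge-regression objective (an illustrative stand-in for the real
    # local subproblem). Only the resulting change in w is communicated.
    w_local = w.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w_local - y) / len(y) + lam * w_local
        w_local -= lr * grad
    return w_local - w

def distributed_round(partitions, w):
    # One synchronous round: each worker returns one vector; the driver
    # averages the local changes into the shared model.
    deltas = [local_update(X, y, w) for (X, y) in partitions]
    return w + np.mean(deltas, axis=0)

# Toy usage: 4 partitions of synthetic data, 20 communication rounds.
rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)
partitions = []
for _ in range(4):
    X = rng.normal(size=(50, d))
    partitions.append((X, X @ w_true + 0.01 * rng.normal(size=50)))
w = np.zeros(d)
for _ in range(20):
    w = distributed_round(partitions, w)

The point of the sketch is the communication pattern: the data sent per round is one d-dimensional vector per worker, independent of how much local work each worker performs.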


High-dimensional Inference via Lipschitz Sparsity-Yielding Regularizers

Non-convex regularizers are increasingly applied to high-dimensional inference with sparsity prior knowledge. In general, nonconvex regularizers are superior to convex ones in inference, but they suffer from the difficulties brought by local optima and heavy computation. A "good" regularizer should perform well in both inference and optimization. In this paper, we prove that some non-convex...


Primal-Dual convex optimization in large deformation diffeomorphic registration with robust regularizers

This paper proposes a method for primal-dual convex optimization in variational Large Deformation Diffeomorphic Metric Mapping (LDDMM) problems formulated with robust regularizers and image similarity metrics. The method is based on the Chambolle and Pock primal-dual algorithm for solving general convex optimization problems. Diagonal preconditioning is used to ensure the convergence of the algorit...
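For reference, the classical Chambolle-Pock iteration for a generic problem \min_x F(Kx) + G(x) (generic notation, not the specific LDDMM formulation of the paper) is

y^{k+1} = \mathrm{prox}_{\sigma F^*}\left(y^k + \sigma K \bar{x}^k\right), \quad x^{k+1} = \mathrm{prox}_{\tau G}\left(x^k - \tau K^\top y^{k+1}\right), \quad \bar{x}^{k+1} = x^{k+1} + \theta\left(x^{k+1} - x^k\right),

where σ, τ > 0 are dual and primal step sizes (with στ‖K‖² < 1 in the standard analysis) and θ ∈ [0, 1] is the over-relaxation parameter; diagonal preconditioning replaces the scalar step sizes σ and τ with diagonal matrices.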


A Comparison of Algorithms for Learning with Nonconvex Regularization

Convex regularizers are popular for sparse and low-rank learning, mainly due to their nice statistical and optimization guarantees. However, they often lead to biased estimation, so the resulting sparsity and accuracy are not as good as desired. This motivates replacing convex regularizers with nonconvex ones, and recently many nonconvex regularizers have been proposed, indeed showing better performance than...


Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server

This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. The algorithm can handle both general convex (possibly non-smooth) regularizers and general convex constraints. When the empirical data loss is strongly convex, we establish a linear convergence rate, give explicit expr...
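As a rough illustration of this family of methods, a generic proximal incremental aggregated gradient update (illustrative only; not necessarily the exact scheme of the paper) takes the form

x^{k+1} = \mathrm{prox}_{\gamma R}\left( x^k - \frac{\gamma}{n} \sum_{i=1}^{n} \nabla f_i\left(x^{k - \tau_i^k}\right) \right),

where R is the regularizer (with constraints absorbed via an indicator function), γ > 0 is the step size, and each component gradient ∇f_i may be evaluated at an outdated iterate with bounded delay τ_i^k, which is what allows workers to report gradients to the parameter server asynchronously.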




Publication date: 2016